    Inconsistent use of gesture space during abstract pointing impairs language comprehension

    Pointing toward concrete objects is a well-known and efficient communicative strategy. Much less is known about the communicative effectiveness of abstract pointing, where the pointing gestures are directed to “empty space.” McNeill's (2003) observations suggest that abstract pointing can be used to establish referents in gesture space without the referents being physically present. Recently, however, it has been shown that abstract pointing typically provides information redundant with the uttered speech, suggesting a very limited communicative value (So et al., 2009). In a first approach to this issue, we asked whether perceivers are sensitive to this gesture cue at all or whether it is completely discarded as irrelevant add-on information. Sensitivity to, for instance, a gesture-speech mismatch would suggest a potential communicative function of abstract pointing. We therefore devised a mismatch paradigm in which participants watched a video in which a woman was interviewed on various topics. During her responses, she established two concepts in space using abstract pointing (e.g., pointing to the left when saying Donald, and pointing to the right when saying Mickey). In the last response to each topic, the pointing gesture accompanying a target word (e.g., Donald) was either consistent or inconsistent with the previously established location. Event-related brain potentials showed an increased N400 and P600 when gesture and speech referred to different referents, indicating that inconsistent use of gesture space impairs language comprehension. Abstract pointing influenced comprehension even though gesture was not crucial to understanding the sentences or performing the experimental task. These data suggest that a referent was retrieved via abstract pointing and that abstract pointing can potentially be used for referent indication in a discourse. We conclude that abstract pointing has a potential communicative function.

    Action comprehension: deriving spatial and functional relations.

    A perceived action can be understood only when information about the action carried out and the objects used is taken into account. We investigated how spatial and functional information contributes to establishing these relations. Participants observed static frames showing a hand wielding an instrument and a potential target object of the action. The 2 elements could either match or mismatch, spatially or functionally. Participants were required to judge only 1 of the 2 relations while ignoring the other. Both irrelevant spatial and functional mismatches affected judgments of the relevant relation. Moreover, the functional relation provided a context for the judgment of the spatial relation, but not vice versa. The results are discussed with respect to recent accounts of action understanding.

    Don’t Get Me Wrong: ERP Evidence from Cueing Communicative Intentions

    How can speakers make sure that their utterances are understood as intended when interlocutors are not facing each other? To convey communicative intentions in digital communication, emoticons and pragmatic cues are frequently used. Such cueing becomes even more crucial for implied interpretations (e.g., irony) that cannot be understood literally but require extra information. Sentences such as ‘That’s fantastic’ may achieve either a literal or an ironic meaning depending on the contextual constraints. In two experiments using event-related brain potentials (ERPs), we examined the effects of cueing communicative intentions (i.e., by means of quotation marks) on ironic and literal language comprehension. An impact of cueing on language processing was seen as early as 200 ms post-stimulus onset in the emergence of a P300 preceding a sustained positivity for cued irony relative to literal language, while for uncued irony a P200-P600 pattern was obtained. In the presence of additional information signaling ironic intentions, pragmatic reanalysis allowing inferences on the message level may have occurred immediately. Moreover, examining the manner of cueing (i.e., ambiguous vs. unambiguous cueing) showed that this type of information about communicative intentions was effective only when the cues were unambiguous, matching pragmatic conventions. The findings suggest that cueing communicative intentions may immediately affect language comprehension, albeit depending on the pragmatic conventions of the cues’ usage.

    Gesture Facilitates the Syntactic Analysis of Speech

    Recent research suggests that the brain routinely binds together information from gesture and speech. However, most of this research focused on the integration of representational gestures with the semantic content of speech. Much less is known about how other aspects of gesture, such as emphasis, influence the interpretation of the syntactic relations in a spoken message. Here, we investigated whether beat gestures alter which syntactic structure is assigned to ambiguous spoken German sentences. The P600 component of the event-related brain potential indicated that the more complex syntactic structure is easier to process when the speaker emphasizes the subject of a sentence with a beat. Thus, a simple flick of the hand can change our interpretation of who has been doing what to whom in a spoken sentence. We conclude that gestures and speech are integrated systems. Unlike previous studies, which have shown that the brain effortlessly integrates semantic information from gesture and speech, our study is the first to demonstrate that this integration also occurs for syntactic information. Moreover, the effect appears to be gesture-specific and was not found for other stimuli that draw attention to certain parts of speech, including prosodic emphasis or a moving visual stimulus with the same trajectory as the gesture. This suggests that only visual emphasis produced with a communicative intention in mind (that is, beat gestures) influences language comprehension, but not a simple visual movement lacking such an intention.

    Left Motor delta Oscillations Reflect Asynchrony Detection in Multisensory Speech Perception

    During multisensory speech perception, slow delta oscillations (~1–3 Hz) in the listener's brain synchronize with the speech signal, likely engaging in speech signal decomposition. Notable fluctuations in the speech amplitude envelope, reflecting speaker prosody, temporally align with articulatory and body gestures, and both provide complementary sensations that temporally structure speech. Further, delta oscillations in the left motor cortex appear to align with speech and musical beats, suggesting a possible role in the temporal structuring of (quasi-)rhythmic stimulation. We extended the role of delta oscillations to audio-visual asynchrony detection as a test case of the temporal analysis of multisensory prosodic fluctuations in speech. We recorded EEG responses in an audio-visual asynchrony detection task while participants watched videos of a speaker. We filtered the speech signal to remove verbal content and examined how visual and auditory prosodic features temporally (mis-)align. Results confirm that (i) participants accurately detected audio-visual asynchrony; (ii) delta power in the left motor cortex increased in response to audio-visual asynchrony, and the difference in delta power between asynchronous and synchronous conditions predicted behavioural performance; and (iii) delta-beta coupling in the left motor cortex decreased when listeners could not accurately map visual and auditory prosodies. Finally, both behavioural and neurophysiological effects were altered when the speaker's face was degraded by a visual mask. Together, these findings suggest that motor delta oscillations support the detection of asynchrony in multisensory prosodic fluctuations in speech.

    SIGNIFICANCE STATEMENT: Speech perception is facilitated by regular prosodic fluctuations that temporally structure the auditory signal. Auditory speech processing involves the left motor cortex and associated delta oscillations. However, visual prosody (i.e., a speaker's body movements) complements auditory prosody, and it is unclear how the brain temporally analyses different prosodic features in multisensory speech perception. We combined an audio-visual asynchrony detection task with electroencephalographic recordings to investigate how delta oscillations support the temporal analysis of multisensory speech. Results confirmed that asynchrony detection between visual and auditory prosodies leads to increased delta power in the left motor cortex and correlates with performance. We conclude that delta oscillations are invoked in the effort to resolve temporal asynchrony in multisensory speech perception.
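    The band-limited "delta power" measure discussed in this abstract can be illustrated with a minimal sketch: sum the FFT power of a signal within the ~1–3 Hz delta band. This is not the authors' analysis pipeline; the function name, sampling rate, and band edges are illustrative assumptions.

    ```python
    import numpy as np

    def band_power(signal, fs, band=(1.0, 3.0)):
        """Summed FFT power of `signal` within `band` (Hz). Band edges are
        illustrative defaults for the delta range, not the study's values."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        power = np.abs(np.fft.rfft(signal)) ** 2
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return float(power[mask].sum())

    # Toy check: a 2 Hz sinusoid (inside the delta band) carries far more
    # delta power than a 20 Hz sinusoid (outside it).
    fs = 250.0                       # assumed EEG sampling rate
    t = np.arange(0, 10, 1.0 / fs)   # 10 s of samples
    slow = np.sin(2 * np.pi * 2 * t)
    fast = np.sin(2 * np.pi * 20 * t)
    print(band_power(slow, fs) > band_power(fast, fs))  # True
    ```

    A contrast of such band power between asynchronous and synchronous conditions, as in the abstract, would then be a simple difference of these per-condition values.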

    Neural correlates of the processing of co-speech gestures

    In communicative situations, speech is often accompanied by gestures. For example, speakers tend to illustrate certain contents of speech by means of iconic gestures, which are hand movements that bear a formal relationship to the contents of speech. The meaning of an iconic gesture is determined both by its form and by the speech context in which it is performed. Thus, gesture and speech interact in comprehension. Using fMRI, the present study investigated which brain areas are involved in this interaction process. Participants watched videos in which sentences containing an ambiguous word (e.g., She touched the mouse) were accompanied by either a meaningless grooming movement, a gesture supporting the more frequent dominant meaning (e.g., animal), or a gesture supporting the less frequent subordinate meaning (e.g., computer device). We hypothesized that brain areas involved in the interaction of gesture and speech would show greater activation to gesture-supported sentences than to sentences accompanied by a meaningless grooming movement. The main result is that, when contrasted with grooming, both types of gestures (dominant and subordinate) activated an array of brain regions consisting of the left posterior superior temporal sulcus (STS), the inferior parietal lobule bilaterally, and the ventral precentral sulcus bilaterally. Given the crucial role of the STS in audiovisual integration processes, this activation might reflect the interaction between the meaning of the gesture and the ambiguous sentence. The activations in inferior frontal and inferior parietal regions may reflect a mechanism for determining the goal of co-speech hand movements through an observation-execution matching process.

    Meson model for f_0(980) production in peripheral pion-nucleon reactions

    The Juelich model for pion-pion scattering, based on an effective meson-meson Lagrangian, is applied to the analysis of the S-wave production amplitudes derived from the BNL E852 experiment pi^- p -> pi^0 pi^0 n at a pion momentum of 18.3 GeV. The unexpectedly strong dependence of the S-wave partial wave amplitude on the momentum transfer between the proton and neutron in the vicinity of the f_0(980) resonance is explained in our analysis as an interference effect between the correlated and uncorrelated pi^0 pi^0 pairs.

    Comment: 6 pages, 7 figures, formulas added, typos removed, new figure

    Regulated dicing of pre-mir-144 via reshaping of its terminal loop.

    Although the route to generate microRNAs (miRNAs) is often depicted as a linear series of sequential and constitutive cleavages, we now appreciate multiple alternative pathways as well as diverse strategies to modulate their processing and function. Here, we identify an unusually profound regulatory role of conserved loop sequences in vertebrate pre-mir-144, which are essential for its cleavage by the Dicer RNase III enzyme in human and zebrafish models. Our data indicate that pre-mir-144 dicing is positively regulated via its terminal loop and involves the ILF3 complex (NF90 and its partner NF45/ILF2). We provide further evidence that this regulatory switch involves reshaping of the pre-mir-144 apical loop into a structure that is appropriate for Dicer cleavage. In light of our recent findings that mir-144 promotes the nuclear biogenesis of its neighbor mir-451, these data extend the complex hierarchy of nuclear and cytoplasmic regulatory events that can control the maturation of clustered miRNAs.

    Policy, Performativity and Partnership: an Ethical Leadership Perspective

    This article identifies the need to think differently about educational partnerships in a changing and turbulent post-compulsory policy environment in England. The policy and institutional contexts in which universities and colleges currently operate seem to be fuelling performativity at the expense of educational values. There appears to be a sharp interruption in the steady increase in educational partnerships as a vehicle for increasing and widening participation in higher education. We are witnessing a marked change in university/college relationships that appears to be a consequence of government calling a halt to increased participation in higher education, creating an increasingly competitive market for a more limited pool of student places. The implication that educational policy at the national level determines a particular pattern or mode of leadership decision making throughout an institution should, however, be resisted. Policy developments that challenge the moral precepts of education should not be allowed to determine how a leader acts; rather, they should prompt actions that are truly educational, rooted in morality, and attached to identifiable educational values. Educational leaders have the agency to resist restricted discourses in favour of ethical and principled change strategies that are a precondition for sustainable transformative partnerships in post-compulsory education. University leaders in particular are called upon to use their considerable influence to resist narrow policy or managerial instrumentalism or performativity, and to embrace alternatives that are educationally worthwhile and can enhance institutional resilience.